Deep Coding Network

Authors

  • Yuanqing Lin
  • Tong Zhang
  • Shenghuo Zhu
  • Kai Yu
Abstract

This paper proposes a principled extension of the traditional single-layer (flat) sparse coding scheme: a two-layer coding scheme is derived from a theoretical analysis of nonlinear functional approximation that extends recent results on local coordinate coding. The two-layer approach generalizes naturally to deeper, hierarchical multi-layer structures. Empirically, the deep coding approach is shown to yield improved performance on benchmark datasets.
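
The full text is not reproduced here, but the core idea of a two-layer coding scheme (code an input over a first dictionary, then sparse-code the resulting coefficients over a second dictionary) can be sketched roughly as follows. This is a minimal illustration under assumed ingredients: random dictionaries D1 and D2, an ISTA solver for the Lasso subproblems, and arbitrary regularization weights; it is not the authors' actual formulation, which is derived from their analysis of local coordinate coding.

    import numpy as np

    def soft_threshold(v, t):
        # elementwise soft-thresholding, the proximal operator of the L1 norm
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def sparse_code(x, D, lam, n_iter=200):
        # ISTA for min_a 0.5*||x - D a||^2 + lam*||a||_1 (columns of D are atoms)
        L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
        a = np.zeros(D.shape[1])
        for _ in range(n_iter):
            grad = D.T @ (D @ a - x)
            a = soft_threshold(a - grad / L, lam / L)
        return a

    # Hypothetical two-layer coding: the second layer sparse-codes the first
    # layer's coefficients over its own dictionary.
    rng = np.random.default_rng(0)
    x  = rng.standard_normal(64)                 # input signal
    D1 = rng.standard_normal((64, 256))          # layer-1 dictionary (assumed)
    D2 = rng.standard_normal((256, 512))         # layer-2 dictionary (assumed)
    a1 = sparse_code(x, D1, lam=0.1)             # layer-1 code
    a2 = sparse_code(a1, D2, lam=0.1)            # layer-2 code, the deeper representation
    print(a1.shape, a2.shape, int(np.count_nonzero(a2)))

The same recursion (re-coding the previous layer's coefficients) is what makes the scheme extensible to more than two layers.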


Related Resources

Capacity of Erasure Networks Under Spatial Network Coding

We study the capacity of point-to-point erasure networks under a restricted form of network coding that we refer to as spatial network coding. In this form of coding, a node cannot code across successive packets received on one incoming link; coding at a node is restricted to the packets received in the same time slot from different incoming links...
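
As a rough illustration of that restriction, the sketch below codes, at a single node, only the packets that arrive in the same time slot on different incoming links, using a bitwise XOR as the assumed coding operation; the paper's actual coding scheme and erasure model may differ.

    import numpy as np

    def spatial_code(slot_packets):
        # Combine only the packets that arrived in the SAME time slot on DIFFERENT
        # incoming links (bitwise XOR, i.e. coding over GF(2)); successive packets
        # on one link are never mixed. None marks an erased packet.
        received = [p for p in slot_packets if p is not None]
        if not received:
            return None                        # every incoming packet was erased
        coded = np.zeros_like(received[0])
        for p in received:
            coded ^= p
        return coded

    # Hypothetical node with three incoming links in one time slot; link 2 erased.
    rng = np.random.default_rng(1)
    slot = [rng.integers(0, 256, 8, dtype=np.uint8),
            None,
            rng.integers(0, 256, 8, dtype=np.uint8)]
    print(spatial_code(slot))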


AI Oriented Large-Scale Video Management for Smart City: Technologies, Standards and Beyond

Deep learning has achieved substantial success in a series of tasks in computer vision. Intelligent video analysis, which can be broadly applied to video surveillance in various smart city applications, can also be driven by such powerful deep learning engines. To practically facilitate deep neural network models in large-scale video analysis, there are still unprecedented challenges for th...


Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding

Deep Compression is a three-stage compression pipeline: pruning, quantization, and Huffman coding. Pruning reduces the number of weights by about 10x; quantization further improves the compression rate to between 27x and 31x; Huffman coding adds more compression, reaching between 35x and 49x. These compression rates already include the metadata for the sparse representation. Deep Compression doesn't incur loss of accu...
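
A minimal sketch of that three-stage idea, with assumed stand-ins: magnitude pruning at a fixed keep ratio, a simple binning codebook instead of the paper's trained k-means weight sharing, and Huffman code lengths computed only to estimate the coded size.

    import heapq
    from collections import Counter
    import numpy as np

    def prune(w, keep_ratio=0.1):
        # magnitude pruning: keep roughly the largest 10% of weights, zero the rest
        thresh = np.quantile(np.abs(w), 1.0 - keep_ratio)
        return np.where(np.abs(w) >= thresh, w, 0.0)

    def quantize(w, n_clusters=16):
        # weight sharing via simple binning (a stand-in for a k-means codebook):
        # every surviving weight is replaced by the index of its cluster
        edges = np.linspace(w.min(), w.max(), n_clusters + 1)
        idx = np.clip(np.digitize(w, edges) - 1, 0, n_clusters - 1)
        centroids = np.array([w[idx == k].mean() if np.any(idx == k) else 0.0
                              for k in range(n_clusters)])
        return idx, centroids

    def huffman_lengths(symbols):
        # Huffman code length per distinct symbol (merge the two least frequent
        # subtrees and deepen every symbol inside them)
        counts = Counter(symbols)
        if len(counts) == 1:
            return {next(iter(counts)): 1}
        heap = [(c, i, {s: 0}) for i, (s, c) in enumerate(counts.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            c1, _, t1 = heapq.heappop(heap)
            c2, _, t2 = heapq.heappop(heap)
            merged = {s: d + 1 for s, d in {**t1, **t2}.items()}
            heapq.heappush(heap, (c1 + c2, tie, merged))
            tie += 1
        return heap[0][2]

    # Hypothetical layer with 10,000 weights: prune, quantize, Huffman-code the
    # cluster indices, and compare against the uncompressed 32-bit storage.
    rng = np.random.default_rng(2)
    w = rng.standard_normal(10_000).astype(np.float32)
    nonzero = prune(w)
    nonzero = nonzero[nonzero != 0.0]
    idx, centroids = quantize(nonzero)
    lengths = huffman_lengths(idx.tolist())
    bits = sum(lengths[s] for s in idx.tolist())
    print(f"dense: {w.size * 32} bits, coded indices: {bits} bits (+ codebook and index metadata)")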


End-to-End Optimized Speech Coding with Deep Neural Networks

Modern compression algorithms are often the result of laborious domain-specific research; industry standards such as MP3, JPEG, and AMR-WB took years to develop and were largely hand-designed. We present a deep neural network model which optimizes all the steps of a wideband speech coding pipeline (compression, quantization, entropy coding, and decompression) end-to-end directly from raw speech...
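
The sketch below is a toy, NumPy-only stand-in for such a pipeline: a linear encoder, a uniform quantizer, and a linear decoder trained jointly with a straight-through gradient for the quantizer. The frame size, code size, and quantization step are arbitrary, entropy coding is omitted, and none of it reflects the actual network described in the paper.

    import numpy as np

    rng = np.random.default_rng(3)
    frame_len, code_len, step, lr = 160, 20, 0.05, 1e-3   # illustrative sizes, not the paper's

    # Tiny linear "codec": encoder -> uniform quantizer -> decoder, trained jointly.
    W_enc = 0.1 * rng.standard_normal((code_len, frame_len))
    W_dec = 0.1 * rng.standard_normal((frame_len, code_len))

    for _ in range(2000):
        x = rng.standard_normal(frame_len)        # stand-in for a raw speech frame
        code = W_enc @ x                          # "compression"
        q = np.round(code / step) * step          # quantization (non-differentiable)
        recon = W_dec @ q                         # "decompression"
        err = recon - x
        # Straight-through estimator: gradients pass through the quantizer as if it
        # were the identity, which is what lets the whole pipeline train end to end.
        # (Entropy coding of q is omitted in this sketch.)
        g_code = W_dec.T @ (2.0 * err)
        W_dec -= lr * np.outer(2.0 * err, q)
        W_enc -= lr * np.outer(g_code, x)

    print("final frame MSE:", float(np.mean(err ** 2)))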


Detection of Pitting in Gears Using a Deep Sparse Autoencoder

In this paper, a new method for gear pitting fault detection is presented. The presented method is developed based on a deep sparse autoencoder. The method integrates dictionary learning in sparse coding into a stacked autoencoder network. Sparse coding with dictionary learning is viewed as an adaptive feature extraction method for machinery fault diagnosis. An autoencoder is an unsupervised ma...
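
As a loose illustration of that feature-extraction idea (not the paper's stacked-autoencoder pipeline), the sketch below uses scikit-learn's DictionaryLearning as the adaptive feature extractor on hypothetical vibration windows and puts a plain logistic-regression classifier on top of the sparse codes.

    import numpy as np
    from sklearn.decomposition import DictionaryLearning
    from sklearn.linear_model import LogisticRegression

    # Hypothetical vibration "frames": 200 windows of 64 samples each, with binary
    # labels standing in for healthy vs. pitted gear states.
    rng = np.random.default_rng(4)
    X = rng.standard_normal((200, 64))
    y = rng.integers(0, 2, 200)

    # Dictionary learning as an adaptive feature extractor: each window is
    # re-expressed as a sparse code over learned atoms.
    dico = DictionaryLearning(n_components=32, alpha=1.0, max_iter=20, random_state=0)
    codes = dico.fit_transform(X)

    # A simple classifier on the sparse codes stands in for the stacked
    # autoencoder / fine-tuning stage described in the paper.
    clf = LogisticRegression(max_iter=1000).fit(codes, y)
    print("training accuracy:", clf.score(codes, y))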


Reliable Flow Control Coding in Low-Buffer Grid Networks

We consider a grid network whose nodes contain small buffers. A packet that encounters a full buffer along its route incurs extra latency and may be dropped. In this paper, we propose a novel flow control protocol for grid networks called RFCC. RFCC tries to reroute delayed packets and utilizes network coding to introduce a configurable amount of redundant information into the network, thereby incre...
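
A small sketch of the redundancy idea under assumed parameters: k original packets, r extra packets formed as XOR combinations of random subsets, and a GF(2) rank check that tells whether a destination receiving only some of the packets could still decode everything. RFCC's actual rerouting and coding rules are not modeled here.

    import numpy as np

    def gf2_rank(masks):
        # rank over GF(2) of coefficient vectors given as integer bitmasks
        # (Gaussian elimination with the highest set bit as pivot)
        pivots, rank = {}, 0
        for m in masks:
            while m:
                p = m.bit_length() - 1
                if p not in pivots:
                    pivots[p] = m
                    rank += 1
                    break
                m ^= pivots[p]
        return rank

    # Hypothetical redundancy: k plain packets plus r coded packets, each coded
    # packet the XOR of a random nonempty subset of the originals.
    rng = np.random.default_rng(5)
    k, r = 8, 3
    data = [rng.integers(0, 256, 16, dtype=np.uint8) for _ in range(k)]
    packets, coeffs = list(data), [1 << i for i in range(k)]
    for _ in range(r):
        subset = int(rng.integers(1, 2 ** k))      # which originals are XOR-ed in
        coded = np.zeros(16, dtype=np.uint8)
        for i in range(k):
            if subset >> i & 1:
                coded ^= data[i]
        packets.append(coded)
        coeffs.append(subset)

    # Drop some packets in transit; the destination can recover all k originals
    # exactly when the surviving coefficient vectors still span GF(2)^k.
    survivors = [c for c, p in zip(coeffs, packets) if rng.random() > 0.2]
    print("recoverable despite losses:", gf2_rank(survivors) == k)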




Publication date: 2010